Programming Language Abstractions for Modularly Verified Distributed Systems
Distributed systems are rarely developed as monolithic programs. Instead, like any software, these systems may consist of multiple program components, which are then compiled separately and linked together. Modern systems also incorporate various services interacting with each other and with client applications. However, state-of-the-art verification tools focus predominantly on verifying standalone, closed-world protocols or systems, thus failing to account for the compositional nature of distributed systems. For example, standalone verification has the drawback that when protocols and their optimized implementations evolve, one must re-verify the entire system from scratch, instead of leveraging compositionality to contain the reverification effort.
In this paper, we focus on the challenge of modular verification of distributed systems with respect to high-level protocol invariants as well as low-level implementation safety properties. We argue that the missing link between the two is a programming paradigm that would allow one to reason about both high-level distributed protocols and low-level implementation primitives in a single verification-friendly framework. Such a link would make it possible to reap the benefits both from the vast body of research in distributed computing, focused on modular protocol decomposition and consistency properties, and from the recent advances in program verification, which enable the construction of provably correct systems implementations. To showcase the modular verification challenges, we present some typical scenarios of decomposition between a distributed protocol and its implementations. We then describe our ongoing research agenda, in which we are attempting to address the outlined problems by providing a typing discipline and a set of domain-specific primitives for specifying, implementing and verifying distributed systems. Our approach, mechanized within a proof assistant, provides the means of decomposition necessary for modular proofs about distributed protocols and systems.
Multiple Comparison Procedures, Trimmed Means And Transformed Statistics
A modification to testing pairwise comparisons that may provide better control of Type I errors in the presence of non-normality is to use a preliminary test for symmetry that determines whether data should be trimmed symmetrically or asymmetrically. Several pairwise MCPs were investigated, employing a test of symmetry with a number of heteroscedastic test statistics that used trimmed means and Winsorized variances. Results showed better Type I error control than competing robust statistics.
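The two robust building blocks named above, symmetrically trimmed means and Winsorized variances, can be sketched as follows. This is a minimal illustration using SciPy's general-purpose routines, not the specific MCPs or the symmetry test studied in the paper; the function names and the 20% trimming proportion are illustrative choices.

```python
import numpy as np
from scipy import stats

def trimmed_mean_sym(x, prop=0.2):
    # Symmetric trimming: discard the lowest and highest `prop`
    # fraction of observations, then average the rest.
    return stats.trim_mean(x, proportiontocut=prop)

def winsorized_variance(x, prop=0.2):
    # Winsorizing replaces the trimmed tails with the nearest retained
    # values; the sample variance of the Winsorized data is the usual
    # scale estimate paired with trimmed means.
    w = stats.mstats.winsorize(np.sort(np.asarray(x, dtype=float)),
                               limits=(prop, prop))
    return np.var(np.asarray(w), ddof=1)

data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]
# The outlier 100 is trimmed away: mean of 3..8 is 5.5.
print(trimmed_mean_sym(data))      # 5.5
print(winsorized_variance(data))   # variance of [3,3,3,4,5,6,7,8,8,8]
```

Note how the trimmed mean is unaffected by the single extreme value, which is the property these procedures exploit under non-normality.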
Hot Stars and Cool Clouds: The Photodissociation Region M16
We present high-resolution spectroscopy and images of a photodissociation
region (PDR) in M16 obtained during commissioning of NIRSPEC on the Keck II
telescope. PDRs play a significant role in regulating star formation, and M16
offers the opportunity to examine the physical processes of a PDR in detail. We
simultaneously observe both the molecular and ionized phases of the PDR and
resolve the spatial and kinematic differences between them. The most prominent
regions of the PDR are viewed edge-on. Fluorescent emission from nearby stars
is the primary excitation source, although collisions also preferentially
populate the lowest vibrational levels of H2. Variations in density-sensitive
emission line ratios demonstrate that the molecular cloud is clumpy, with an
average density n = 3x10^5 cm^(-3). We measure the kinetic temperature of the
molecular region directly and find T_H2 = 930 K. The observed density,
temperature, and UV flux imply a photoelectric heating efficiency of 4%. In the
ionized region, n_i=5x10^3 cm^(-3) and T_HII = 9500 K. In the brightest regions
of the PDR, the recombination line widths include a non-thermal component,
which we attribute to viewing geometry.
Comment: 5 pages including 2 Postscript figures. To appear in ApJ Letters, April 200
A Power Comparison of Robust Test Statistics Based On Adaptive Estimators
Seven test statistics known to be robust to the combined effects of nonnormality and variance heterogeneity were compared for their sensitivity to detect treatment effects in a one-way completely randomized design containing four groups. The six Welch-James-type heteroscedastic tests adopted either symmetric or asymmetric trimmed means, were transformed for skewness, and used a bootstrap method to assess statistical significance. The remaining test, due to Wilcox and Keselman (2003), used a modification of the well-known one-step M-estimator of central tendency rather than trimmed means. The Welch-James-type test is recommended because, for nonnormal data likely to be encountered in applied research settings, it should be more powerful than the test presented by Wilcox and Keselman. However, the reverse is true for data that are extremely nonnormal.
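The heteroscedastic core of the tests compared above is the Welch-type one-way statistic, which weights each group by its own variance instead of assuming a pooled one. The sketch below implements the classic Welch (1951) statistic as an illustration; the paper's variants additionally use trimmed means, a skewness transformation, and bootstrap critical values, none of which are shown here.

```python
import numpy as np
from scipy import stats

def welch_anova(*groups):
    # Welch's heteroscedastic one-way test: groups are weighted by
    # n_j / s_j^2, so unequal variances do not require pooling.
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                           # precision weights
    mw = np.sum(w * m) / np.sum(w)      # variance-weighted grand mean
    a = np.sum(w * (m - mw) ** 2) / (k - 1)
    b = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    fstat = a / (1 + 2 * (k - 2) / (k ** 2 - 1) * b)
    df2 = (k ** 2 - 1) / (3 * b)        # Welch's approximate denominator df
    pval = stats.f.sf(fstat, k - 1, df2)
    return fstat, pval

# Four groups with heterogeneous variances, as in the design studied.
rng = np.random.default_rng(0)
groups = [rng.normal(loc, sd, 30)
          for loc, sd in [(0, 1), (0, 3), (1, 1), (1, 5)]]
fstat, pval = welch_anova(*groups)
print(fstat, pval)
```

Replacing the ordinary means and variances in `welch_anova` with trimmed means and Winsorized variances yields the Welch-James-type robust variants the paper evaluates.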
Robust Modifications of the Levene and O’Brien Tests for Spread
Variants of Levene's and O'Brien's procedures not investigated by Keselman, Wilcox & Algina (2008) were examined. Simulations indicate that a new O'Brien variant provides very good Type I error control and is simpler for applied researchers to compute than the method recommended by Keselman et al.
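For context, a Levene-type spread test runs an ANOVA on absolute deviations from a center estimate, and robust variants swap in a resistant center. The sketch below uses SciPy's built-in `levene`, whose `center='trimmed'` option is one such robustification; it is not the specific new O'Brien variant proposed in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(0, 1, 50)   # group with small spread
b = rng.normal(0, 3, 50)   # group with large spread

# Classic Levene test: deviations from the group means.
stat_mean, p_mean = stats.levene(a, b, center='mean')

# Robust variant: deviations from 10% trimmed means, which protects
# the spread test itself against heavy-tailed data.
stat_trim, p_trim = stats.levene(a, b, center='trimmed',
                                 proportiontocut=0.1)
print(p_mean, p_trim)
```

With a threefold difference in standard deviation and 50 observations per group, both variants should flag the spread difference; the robust centering matters mainly when the data are skewed or heavy-tailed.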